Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity



Related articles

Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity

We extend the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions. For the deterministic projected subgradient method, we present a global O(1/√T) convergence rate for any convex function which is locally Lipschitz around its minimizers. This approach is based on Shor's classic subgradient analysis and implies generalizations of the standard convergenc...
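To make the iteration concrete, here is a minimal sketch of a projected subgradient method with a fixed-horizon 1/√T-type step size and a best-iterate tracker; the objective ||x - c||_1, the unit-ball projection, and the step-size rule are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def projected_subgradient(f, subgrad, project, x0, T, R=1.0):
    """Minimal sketch of a deterministic projected subgradient method:
    x_{k+1} = P_C(x_k - alpha_k * g_k), with g_k a subgradient of f at x_k.
    Tracking the best iterate gives the usual O(1/sqrt(T)) guarantee under
    standard (Lipschitz-type) assumptions; the step size below is illustrative.
    """
    x = np.asarray(x0, dtype=float).copy()
    best_x, best_val = x.copy(), f(x)
    for _ in range(T):
        g = subgrad(x)
        alpha = R / (np.linalg.norm(g) * np.sqrt(T) + 1e-12)  # fixed-horizon step size
        x = project(x - alpha * g)
        val = f(x)
        if val < best_val:
            best_x, best_val = x.copy(), val
    return best_x, best_val

# Illustrative use: minimize f(x) = ||x - c||_1 over the Euclidean unit ball.
c = np.array([2.0, -1.0, 0.5])
f = lambda x: np.sum(np.abs(x - c))
subgrad = lambda x: np.sign(x - c)                   # one valid subgradient
project = lambda x: x / max(1.0, np.linalg.norm(x))  # projection onto the unit ball
x_best, f_best = projected_subgradient(f, subgrad, project, np.zeros(3), T=5000)
```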


Stochastic Subgradient Methods

Stochastic subgradient methods play an important role in machine learning. We introduced the concepts of subgradient methods and stochastic subgradient methods in this project, discussed their convergence conditions as well as their strengths and weaknesses relative to competing methods. We demonstrated the application of (stochastic) subgradient methods to machine learning with a running example of tr...
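As a hedged illustration of what a stochastic subgradient step looks like in a machine-learning setting, the sketch below applies it to a regularized hinge-loss objective; the objective, the 1/(lam*t) step size, and the synthetic data are illustrative choices, not the project's own running example.

```python
import numpy as np

def stochastic_subgradient_hinge(X, y, lam=0.1, T=10_000, seed=0):
    """Illustrative stochastic subgradient method for the regularized hinge loss
    f(w) = (lam/2)*||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>).
    Each step samples one example and uses a subgradient of its term;
    the 1/(lam*t) step size exploits the lam-strong convexity of f.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, T + 1):
        i = rng.integers(n)
        margin = y[i] * (X[i] @ w)
        g = lam * w - (y[i] * X[i] if margin < 1 else 0.0)  # stochastic subgradient
        w -= g / (lam * t)
    return w

# Toy usage on synthetic, roughly separable data (purely for illustration).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = np.sign(X @ rng.normal(size=5))
w_hat = stochastic_subgradient_hinge(X, y)
```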


Stochastic Subgradient MCMC Methods

Many Bayesian models involve continuous but non-differentiable log-posteriors, including the sparse Bayesian methods with a Laplace prior and the regularized Bayesian methods with max-margin posterior regularization that acts like a likelihood term. In analogy to the popular stochastic subgradient methods for deterministic optimization, we present the stochastic subgradient MCMC for efficient po...
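The sketch below is only a rough illustration of the general idea, a Langevin-type sampler driven by a subgradient of a non-differentiable Laplace log-prior; the SGLD-style update, the step size eps, and the callback minibatch_grad_loglik are assumptions made for illustration and not the algorithm presented in that paper.

```python
import numpy as np

def subgradient_langevin(theta0, minibatch_grad_loglik, b=1.0, eps=1e-4,
                         n_iter=5000, seed=0):
    """Rough illustration only (not the paper's algorithm): a stochastic-gradient
    Langevin-type sampler in which the non-differentiable Laplace log-prior
    log p(theta) = -||theta||_1 / b + const contributes a subgradient -sign(theta)/b.
    `minibatch_grad_loglik(theta, rng)` is a hypothetical callback returning an
    unbiased minibatch estimate of the full-data log-likelihood gradient.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    samples = []
    for _ in range(n_iter):
        g = -np.sign(theta) / b + minibatch_grad_loglik(theta, rng)
        theta = theta + 0.5 * eps * g + np.sqrt(eps) * rng.normal(size=theta.shape)
        samples.append(theta.copy())
    return np.asarray(samples)
```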


Adaptive Subgradient Methods for Online Learning and Stochastic Optimization

We present a new family of subgradient methods that dynamically incorporate knowledge of the geometry of the data observed in earlier iterations to perform more informative gradient-based learning. Metaphorically, the adaptation allows us to find needles in haystacks in the form of very predictive but rarely seen features. Our paradigm stems from recent advances in stochastic optimization and on...
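A minimal sketch of the diagonal adaptive (AdaGrad-style) update follows; the step size eta, the damping constant eps, and the toy objective are illustrative choices.

```python
import numpy as np

def adagrad(subgrad, x0, eta=0.1, T=1000, eps=1e-8):
    """Sketch of the diagonal adaptive (AdaGrad-style) update: each coordinate is
    scaled by the inverse square root of its accumulated squared subgradients,
    so infrequently active but informative coordinates receive larger steps.
    """
    x = np.asarray(x0, dtype=float).copy()
    accum = np.zeros_like(x)                   # per-coordinate sum of squared subgradients
    for _ in range(T):
        g = subgrad(x)
        accum += g * g
        x -= eta * g / (np.sqrt(accum) + eps)  # adaptive per-coordinate step
    return x

# Illustrative use on a simple nonsmooth objective f(x) = ||x - c||_1.
c = np.array([1.0, -2.0, 0.0, 3.0])
x_hat = adagrad(lambda x: np.sign(x - c), np.zeros(4))
```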


Randomized Block Subgradient Methods for Convex Nonsmooth and Stochastic Optimization

Block coordinate descent methods and stochastic subgradient methods have been extensively studied in optimization and machine learning. By combining randomized block sampling with stochastic subgradient methods based on dual averaging ([22, 36]), we present stochastic block dual averaging (SBDA)—a novel class of block subgradient methods for convex nonsmooth and stochastic optimization. SBDA re...
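For intuition, here is a hedged sketch that combines randomized block sampling with a plain Euclidean dual-averaging step; it is not the SBDA method itself, and the block rescaling, the sqrt(t) schedule, and the toy objective are assumptions made for illustration.

```python
import numpy as np

def random_block_dual_averaging(subgrad, d, n_blocks, T=5000, gamma=1.0, seed=0):
    """Hedged sketch (not SBDA itself): randomized block sampling on top of a
    plain Euclidean dual-averaging step. One coordinate block is sampled per
    iteration, only that block of the dual state z is updated (rescaled so the
    update is unbiased in expectation), and the primal point is the closed-form
    minimizer of <z, x> + (beta/2)*||x||^2. In practice one would evaluate only
    the sampled block of the subgradient.
    """
    rng = np.random.default_rng(seed)
    blocks = np.array_split(np.arange(d), n_blocks)
    x, z = np.zeros(d), np.zeros(d)
    for t in range(1, T + 1):
        g = subgrad(x)                            # full subgradient, for simplicity
        j = rng.integers(n_blocks)
        z[blocks[j]] += n_blocks * g[blocks[j]]   # unbiased block update of the dual sum
        beta = gamma * np.sqrt(t)                 # standard dual-averaging schedule
        x = -z / beta                             # closed-form Euclidean prox step
    return x

# Illustrative use on f(x) = ||x - c||_1.
c = np.linspace(-1.0, 1.0, 10)
x_hat = random_block_dual_averaging(lambda x: np.sign(x - c), d=10, n_blocks=5)
```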



Journal

Journal title: SIAM Journal on Optimization

Year: 2019

ISSN: 1052-6234, 1095-7189

DOI: 10.1137/18m117306x